APPARATUS AND SYSTEM FOR MONITORING AND METHOD FOR PERFORMING REAL-TIME MONITORING OF OXYGENATION BY PERFUSION
Patent abstract:
Devices, systems and methods for tissue oximetry and perfusion imaging are presented: a compact perfusion tester and a method for characterizing tissue health status that incorporate pressure-sensing components in conjunction with optical sensors to monitor the level of pressure applied to the target tissue and to obtain accurate blood and skin/tissue perfusion measurements and oximetry. The systems and methods allow for perfusion imaging and mapping (geometric and temporal), signal processing and pattern recognition, noise cancellation, and fusion of perfusion data, tester position and pressure readings.
Publication number: BR112013018023B1
Application number: R112013018023-4
Filing date: 2012-01-19
Publication date: 2021-09-08
Inventors: Majid Sarrafzadeh; William Kaiser; Barbara Bates-Jensen; Alireza Mehrnia; Bijan Mapar; Frank Wang
Applicant: The Regents Of The University Of California
Main IPC classification:
Patent description:
DESCRIPTIVE REPORT REFERENCE TO RELATED ORDERS [001] This application claims priority from provisional US patent application, serial number 61/434,014, filed January 19, 2011, incorporated herein by reference in its entirety. STATEMENT REGARDING RESEARCH OR DEVELOPMENT FEDERALLY SPONSORED [002] Not Applicable. INCORPORATION AS A REFERENCE OF THE SUBMITTED MATERIAL ON A COMPACT DISC [003] Not Applicable. NOTICE OF MATERIAL SUBJECT TO THE PROTECTION OF RIGHTS AUTHORSHIPS [004] A portion of the material in this patent document is subject to copyright protection under the copyright laws of the United States and other countries. The copyright owner does not object to the facsimile reproduction by anyone of the patent document or patent disclosure as it appears in publicly available US Patent and Trademark Office documents or records, but of otherwise, absolutely all copyright is reserved. The copyright owner hereby does not waive any of its rights to have this patent document kept secret, including without limitation its rights under 37 C.F.R. § 1.14. HISTORY OF THE INVENTION [005] 1. Field of Invention [006] This invention relates, in general, to tissue oximetry and, more particularly, to tissue oximetry and perfusion imaging. [007] 2. Description of Related Art [008] The integrity of patients' skin has long been a matter of concern for nurses and nursing homes. Maintaining skin integrity has been identified by the American Nurses Association as an important indicator of quality nursing care. Meanwhile, ulcers, and specifically venous and pressure ulcers, remain major health problems, particularly for hospitalized elderly people. Detecting early wound formation is an extremely challenging and costly problem. [009] When age is considered along with other risk factors, the incidences of these ulcers are significantly higher. The overall incidence of pressure ulcers for hospitalized patients ranges from 2.7% to 29.5% and rates greater than 50% have been reported for patients in intensive care settings. In a retrospective multicenter cohort study of 1,803 elderly patients discharged from intensive care hospitals with selected diagnoses, 13.2% (ie, 164 patients) demonstrated an incidence of stage I ulcers. Of those 164 patients, 38 (16) %) had ulcers that progressed to a more advanced stage. [0010] Pressure ulcers were additionally associated with an increased risk of death within one year of hospital discharge. The estimated cost of treating pressure ulcers ranges from $5,000 to $40,000 for each ulcer, depending on the severity. Meanwhile, venous ulcers can also cause significant health problems for hospitalized patients, particularly the elderly. Approximately 3% of the population suffer from leg ulcers, even though that number rises to 20% in those over 80 years of age. The average cost of treating a venous ulcer is estimated at $10,000, and it can easily rise to $20,000 without effective treatment and early diagnosis. [0011] Once a patient has been affected by a venous ulcer, the probability of the wound recurring is also extremely high, and ranges from 54% to 78%. This means that venous ulcers can have severe negative effects for those who suffer from them, significantly reducing the quality of life and requiring extensive treatment. The impact of venous ulcers is often underestimated, despite accounting for 2.5% of the total health budget. 
[0012] The high cost and incidence rates of venous ulcers, together with the difficulty of treating them, present an excellent opportunity to introduce a low-cost, non-invasive system capable of providing early detection. Even though traditional laser Doppler systems are capable of producing relatively accurate and reliable information, they cannot be used for continuous monitoring of patients because they require bulky and extremely expensive equipment. Solutions that are too expensive or difficult to implement see their adoption significantly limited.

[0013] Therefore, there is a need for a preventive and monitoring solution to scan tissue and measure tissue perfusion status, as a measure of the level of oxygen distribution and penetration through the tissue and thus as an indicator of tissue health. Consequently, an object of the present invention is the use of photoplethysmography in conjunction with pressure sensor signals to monitor the perfusion levels of patients suffering from, or at risk of developing, venous ulcers.

BRIEF SUMMARY OF THE INVENTION

[0014] The systems and methods of the present invention include a compact perfusion scanner configured to scan and map tissue blood perfusion as a means to detect and monitor the development of ulcers. The device incorporates a platform, a digital signal processing unit, a serial connection to a computer, a pressure sensor, a pressure measurement system, a pair of LED and photodiode sensors, and a visual data-exploration interface.

[0015] The systems and methods of the present invention offer effective preventive measures by allowing the early detection of ulcer formation or pressure-induced inflammation that would otherwise remain undetected for a prolonged period, thereby increasing the risk of infection and the development of higher-stage ulcers.

[0016] In a preferred embodiment, the compact perfusion tester and method for characterizing tissue health status according to the present invention incorporate pressure-sensing components together with optical sensors to monitor the level of pressure applied to the target tissue and thereby obtain accurate measurements of cutaneous/tissue blood perfusion and oximetry. The systems and methods of the present invention allow for new capabilities, including, but not limited to: measurement capabilities such as perfusion imaging and perfusion mapping (geometric and temporal), signal processing and pattern recognition, automatic usage assurance through usage tracking and pressure imaging, as well as data fusion.

[0017] A particular benefit of the sensor-enhanced system of the present invention is the ability to better monitor each patient individually, resulting in more timely and efficient practice in hospitals and even in nursing homes. This is applicable to patients with a history of chronic wounds, diabetic foot ulcers, pressure ulcers or post-operative wounds.

[0018] In addition, changes in signal content can be integrated with the patient's activity level, the patient's body position, and standardized symptom assessments. By keeping the data collected from these patients in a database of signals, pattern classification, pattern search, and pattern matching algorithms can be used to better correlate symptoms with changes in skin characteristics and the development of ulcers.
[0019] An aspect is a device for monitoring perfusion oxygenation of a target tissue region of a patient, comprising: a tester comprising: a planar array of sensors; the array of sensors configured to be positioned in contact with the surface of the target tissue region; the sensor array comprising one or more LEDs configured to emit light in the tissue-target region at a hemoglobin-encoded wavelength; the array of sensors comprising one or more photodiodes configured to detect light reflected from the LEDs; and a data acquisition controller coupled to one or more LEDs and one or more photodiodes to control the emission and reception of light from the sensor array to obtain perfusion oxygenation data associated with the tissue target region. [0020] Another aspect is a system for monitoring perfusion oxygenation of a target tissue region of a patient, comprising: (a) a tester comprising: a planar array of sensors; the array of sensors configured to be positioned in contact with the surface of the target tissue region; the sensor array comprising one or more light sources configured to emit light in the target tissue region at a hemoglobin encoded wavelength; the sensor array comprising one or more sensors configured to detect light reflected from the light sources; a pressure sensor coupled to the sensor array; the pressure sensor configured to obtain pressure readings from the contact of the sensor array with the surface of the target tissue region; and (b) a data acquisition controller coupled to one or more sensors and for controlling the emission and reception of light from the sensor array to obtain perfusion oxygenation data associated with the target tissue; and (c) a processing module coupled to the data acquisition controller; (d) the processing module configured to control the sampling of the pressure sensor and sensor array for simultaneous acquisition of perfusion oxygenation data and pressure sensor data to ensure proper contact of the tester with the surface of the tissue region. target. [0021] An additional aspect is a method to accomplish the. real-time monitoring of perfusion oxygenation of a target tissue region of a patient, comprising: positioning an array of sensors in contact with the surface of the target tissue region; emitting light from the light sources of the sensor array in the tissue-target region at a hemoglobin-encoded wavelength; receiving light reflected from light sources; obtaining the pressure data associated with the contact of the sensor array with the surface of the target tissue region; obtaining perfusion oxygenation data associated with the target tissue region; and sampling perfusion oxygenation data and pressure data to ensure proper contact of the sensor array with the surface of the target tissue region. [0022] It is appreciated that the systems and methods of the present invention are not limited to the specific condition of ulcer or wound, but that they may have wide application in all forms of control or treatment of wounds such as skin diseases. [0023] Additional aspects of the invention will be emphasized in the following parts of the specification, where the detailed description is intended to fully reveal preferred embodiments of the invention without placing limitations thereon. BRIEF DESCRIPTION OF THE VARIOUS VIEWS OF THE DRAWINGS) [0024] The invention will be better understood with reference to the following drawings, which are for illustrative purposes only: [0025] FIG. 
1 shows a preferred embodiment of a perfusion oxygenation (POM) monitoring system for analyzing a tissue region in accordance with the present invention;
[0026] FIGs. 2A and 2B illustrate front and right perspective views of the perfusion tester printed circuit board of the present invention;
[0027] FIG. 3 illustrates an exemplary LED emitter in accordance with the present invention;
[0028] FIG. 4 illustrates an LED driver circuit in accordance with the present invention;
[0029] FIG. 5 illustrates an exemplary photodiode readout circuit configured to read the signal from the photodiode sensor array;
[0030] FIG. 6 illustrates a calibration setup for pressure sensor calibration;
[0031] FIG. 7 shows a graph of results from pressure verification studies of 50 g, 100 g, 200 g, and 500 g weights on a single sensor;
[0032] FIG. 8 is a graph showing the measured pressure response curve, the interpolated (exponential) curve, and the point where the pressure sensor is specified to saturate;
[0033] FIG. 9 shows the results of pressure verification studies on a second 1-lb sensor;
[0034] FIG. 10 is a graph showing raw pressure response curves and various fits;
[0035] FIG. 11 illustrates the configuration of a PC to run the perfusion oxygenation (POM) monitoring system of the present invention;
[0036] FIG. 12 shows an image of the interface of the hardware configuration module according to the present invention;
[0037] FIG. 13 shows an image of the graphical user interface according to the present invention;
[0038] FIG. 14 shows an exemplary interpolation performed by means of a Kriging algorithm;
[0039] FIG. 15 shows a schematic diagram of a marker pattern used to test the feature extraction module;
[0040] FIG. 16 illustrates the configuration of figure 15 superimposed on an image;
[0041] FIG. 17 illustrates a block diagram of a method for generating an image of the mapped and interpolated perfusion;
[0042] FIG. 18 shows an example of heterodyning used to help eliminate in-band noise in accordance with the present invention;
[0043] FIG. 19 is a graph of the theoretical response of the subtraction method of figure 18 in relation to noise and correction frequency;
[0044] FIG. 20 is a graph of the frequency response of the subtraction method shown on a dB scale;
[0045] FIG. 21 shows the results of employing noise subtraction on a high-frequency LED drive signal, with averaging over several LED drive periods to obtain data rates similar to those above;
[0046] FIG. 22 illustrates an enlarged view of Figure 21;
[0047] FIG. 23 shows a sample of the time domain signals used to compare neck and thumb tissue measurements;
[0048] FIG. 24 shows the frequency domain representation of the measured signals;
[0049] FIG. 25 shows the results of plethysmographic signals extracted from the forehead;
[0050] FIG. 26 shows a comparison of plethysmographic signal readings taken from the underside of the thumb joint;
[0051] FIG. 27 shows the results of variable pressure using the neck reflectance sensor; and
[0052] FIG. 28 shows the results both above and to the side of the black tape.

DETAILED DESCRIPTION OF THE INVENTION

[0053] Figure 1 shows a preferred configuration of a perfusion oxygenation (POM) monitoring system 10 for analyzing tissue region 52 of a patient 18 in accordance with the present invention.
System 10 generally comprises six main components: the red/infrared LED array 44, the photodiode array 46, the pressure sensor 50 with its pressure measurement system 48 (which includes amplification and filtering circuitry), the data acquisition unit 40, the digital signal processing module 12, and the application module 14 featuring a user interface.

[0054] The system 10 comprises the detection component 16, which includes the emitter/sensor arrays (44, 46, 50) and data acquisition unit 40, preferably in a portable device (not shown). The LED array 44 and photodiode arrays 46 coupled to the data acquisition unit 40 (for example, via cabling or a wireless connection) can be physically configured in a variety of arrangements. Data acquisition unit 40 is preferably capable of interfacing with a large number of individual LEDs and photodiodes. A signal amplification and filtering unit 49 may be used to condition the photodiode signal/data before it is received by the data acquisition unit 40. In a preferred embodiment, the photodiode signal amplification and filtering unit 49 may comprise the photodiode readout circuit 120 shown in Figure 5 and described in more detail below.

[0055] The detection/scanning component 16 also includes an intensity controller 42 to control the output of the LED array 44. The intensity controller 42 preferably comprises the LED driver circuit 100 shown in Figure 4 and described in more detail below.

[0056] The data acquisition system 40 also interfaces with the application module 14 on PC 154 (see Figure 11), allowing a user to configure the LED array 44 signaling as well as the sampling rate of the photodiode array 46 signal via a hardware configuration module 34, which is displayed via the graphical user interface 36. The data acquired from the DAC 40 is preferably stored in a database 32 for subsequent processing.

[0057] The pressure sensor 50 is configured to measure the pressure applied by the hardware package 16 on the patient's tissue, so that pressure readings can be acquired to maintain consistent and appropriate pressure on the skin 52 while the measurements are being taken. Pressure sensor 50 may be coupled to preconditioning or measurement circuitry 48, which includes amplification and filtering circuitry to process the signal before it is received by data acquisition controller 40.

[0058] The LED arrays 44 are configured to project light at wavelengths encoded for hemoglobin into the target tissue 52, and the photodiode sensor arrays 46 measure the amount of light that passes through the tissue 52.

[0059] The signal processing module 12 then further processes and filters the acquired data through processing scripts 24 and the filtering module 22. The signal processing module 12 further comprises a feature extraction module 28, whose output can be provided to the visual interface 36 for further processing and visualization. A perfusion data module 26 converts the data into a plethysmographic waveform, which can be displayed on a monitor or similar device (not shown). The interface 36 and the processing module 12 can also be configured to generate a tissue image overlaid with the captured perfusion data 26.

[0060] In order to produce the wavelengths of light corresponding to the absorption of deoxy- and oxyhemoglobin, system 10 preferably uses light-emitting diodes for the emitting source array 44. In a preferred embodiment, system 10 incorporates the DLED-660/880-CSL-2 dual optical emitter combination from OSI Optoelectronics.
This dual emitter combines a red (660 nm) and an infrared (880 nm) LED in a single package. Each red/infrared LED pair requires a 20 mA current source and has a forward voltage of 2.4 V and 2.0 V, respectively. It is appreciated that other light sources can also be used.

[0061] In order to measure a photoplethysmograph, light reflected from the array of LEDs 44 is detected by the array of photodiodes 46. In a preferred configuration, the PIN-8.0-CSL photodiode from OSI Optoelectronics is used. This photodiode has a spectral range of 350 nm to 1100 nm and responsivities of 0.33 and 0.55 at 660 nm and 900 nm, respectively.

[0062] Figures 2A and 2B illustrate the front and right perspective views of the perfusion tester printed circuit board (PCB) 60. PCB 60 comprises an LED array 44 of two pairs of LEDs 64 spaced between two arrays 46 of photodiodes 62. Board 60 also comprises a pressure sensor 50 for monitoring the pressure applied to target tissue 52.

[0063] As shown in Figure 2A, the optical sensors (for example, the LED array 44 and the photodiode array 46) are located on the front side 66 of the PCB 60 and are configured to face forward and apply pressure (directly, or adjacent to a clear cap (not shown)) to the target tissue 52.

[0064] Referring to Figure 2B, the drive circuitry, for example the connector header 70, is located on the rear side 68 of the PCB 60, safely out of contact with the individual under test, while the front (right) side of the PCB houses the sensor portion of the array. Arrays 44, 46 are located so that the connector header 70 and the corresponding contacts 72 and cables 74 (which couple to the data acquisition unit 40) do not interfere with the use of the device.

[0065] The arrays 44, 46 are shown in Figure 2A as two LEDs 64 positioned between four photodiodes 62. However, it is appreciated that the array may comprise any number and planar configuration of at least one LED emitter 64 and one photodiode receiver.

[0066] Figure 3 illustrates an exemplary LED emitter 64 (DLED-660/880-CSL-2 from OSI Optoelectronics) with a 660 nm red emitter 84 and an 880 nm infrared emitter 82.

[0067] Figure 4 illustrates an LED driver circuit 100 according to the present invention. The LED driver circuit 100 is configured to allow the red LED 84 and the infrared LED 82 in the LED package 64 to be driven independently, even though the LEDs are common-anode, sharing a VDD connection through contacts 80.

[0068] Driver circuit 100 includes a low-noise amplifier 110 coupled to LED 64. In a preferred embodiment, amplifier 110 comprises an LT6200 chip from Linear Technologies; however, it is appreciated that other amplifiers available in the art can also be used. The LED driver 100 further comprises a p-channel MOS field-effect transistor (FET) 112 (e.g., Panasonic's MTM76110), which provides negative feedback. As the voltage at the input increases, so does the voltage across the 50 ohm resistor 102. This results in more current being drawn through LED 64, making it brighter. At 2 V, approximately 40 mA flows through LED 64, providing optimal brightness. If the input voltage is too high, the voltage drop across LED 64 will be insufficient to turn it off, but a large amount of current will still flow through LED 64 and resistor 102, resulting in large heat build-up. For this reason, the input voltage is ideally kept below 3 V to minimize overheating and prevent component damage.
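Restating the figures given above: the feedback arrangement forces the voltage across sense resistor 102 to track the input voltage, so the LED current follows directly from Ohm's law,

$$ I_{\mathrm{LED}} = \frac{V_{\mathrm{in}}}{R_{102}} = \frac{2\,\mathrm{V}}{50\,\Omega} = 40\,\mathrm{mA}, $$

consistent with the 2 V / 40 mA operating point mentioned above.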
If the input to operational amplifier 110 oscillates while amplifier 110 is powered, a 100 kΩ pull-down resistor 104 at the input and a 1 kΩ load resistor 108 at the output ensure that circuit 100 remains off. The 1 kΩ load resistor 108 also ensures that amplifier 110 is capable of providing a rail-to-rail output voltage. The 1 μF capacitor 114 ensures that the output remains stable while still providing enough bandwidth for fast switching of LED 64. In order to provide additional stabilization, the driver circuit 100 can be modified to include Miller compensation at capacitor 114. This change improves the phase margin of the driver circuit 100 at low frequencies, allowing more reliable operation.

[0069] Figure 5 illustrates an exemplary photodiode readout circuit 120 configured to read the signal from photodiode sensor array 46. In a preferred embodiment, photodiode 62 may comprise an OSI Optoelectronics PIN-8.0-DPI photodiode, a PIN-4.0-DPI photodiode, or alternatively a PIN-0.8-DPI photodiode, which has lower capacitance under reverse bias.

[0070] The photodiode readout circuit 120 operates through a simple operational amplifier configured as a current-to-voltage converter 124, as shown in Figure 5. The positive input pin of operational amplifier 124 (for example, an LT6200 from Linear Technologies) is driven by a voltage divider 122 providing 2.5 V (half of VDD). The negative pin is connected to photodiode 62, which is reverse biased, and via feedback to the output of amplifier 124.

[0071] The feedback is controlled by a simple low-pass filter 126 with a 2.7 pF capacitor 129 and a 100 kΩ resistor 130. The 0.1 μF capacitor 128 is used to decouple the voltage divider to ground. The circuit amplifies the current output of the photodiode and converts it to a voltage, allowing the data acquisition unit to read it through its voltage input module.

[0072] It is appreciated that the individual components of the LED driver circuit 100 and the photodiode readout circuit 120 are shown for exemplary purposes only and that other models or types of components may be used as desired.

[0073] In an embodiment of the present invention, the data acquisition controller comprises the National Instruments CompactRIO 9014 real-time controller coupled to an NI 9104 FPGA chassis with a 3M-gate FPGA. The data acquisition controller 40 interfaces with the LED 44 and photodiode 46 arrays using three sets of modules for current output, current input and voltage input.

[0074] In one configuration, controller 40 comprises a processor, a real-time operating system and memory, and supports additional storage via USB (all not shown). Controller 40 may also include an Ethernet port (not shown) for connection to the user interface PC 154. Controller 40 comprises an FPGA backplane, a current output module (e.g., NI 9263), a current input module (e.g., NI 9203), and a voltage input module (e.g., NI 9205) allowing multiple voltage inputs from the photodiode/amplifier modules.

[0075] The POM system 10 preferably employs a pressure sensor 50 to measure pressure and ensure consistent results (e.g., a 1 lb FlexiForce sensor). Due to the confounding effect that pressure variation can have on plethysmographic measurements, pressure sensor 50 readings provide a metric with which the user can consistently apply sensor 16 to the patient's skin 52.

[0076] The pressure sensor 50 is preferably affixed behind the array of LEDs 44, and measures the pressure with which it is applied at a target location.
Pressure sensor 50 is preferably configured to deliver accurate pressure measurements within a specified range, for example a range from zero to approximately one pound, which covers the range of pressures that can reasonably be applied when using the POM detector device 16 . [0077] The pressure sensor 50 is used to guide the user in operating the tester 16 more consistently so that the sensor/checker 16 is positioned similarly for each measurement. The data collected from the oximeter is thus verified to be accurately collected by the pressure sensor 50 readings. [0078] In a preferred configuration, the pressure sensor 50 is calibrated in order to ensure that the pressure sensor presents repeated and well-understood measurements, which can be directly translated into raw pressure values. Figure 6 illustrates a calibration setup 140 for calibrating the pressure sensor 50. A rubber pressure applicator 144 was placed on a flat surface and used to distribute the weight over the sensing region of the Flexiforce sensor 50. A weight 142 was used to distribute the weight over the active region of the sensor 50. An experiment was carried out using 4 weights with a range of 50 g to 500 g. Pressure was applied directly to pressure sensor 50 via applicator 144, and its results recorded. [0079] The results in figures 7-10 show a non-linear but regular trend whose data can be used to translate any pressure sensor measurement into an absolute pressure value. [0080] Figure 7 shows a graph of results of pressure verification studies of 50 g, 100 g, 200 g and 500 g weights on a single sensor. Figure 8 is a graph showing the measured pressure response curve, the interpolated (exponential) curve, and the point where the pressure sensor is specified to saturate.' Figure 9 shows the results of pressure verification studies in a second 1 pound sensor. For this experiment, additional intermediate weight levels (eg 150 g and 300 g) were applied. Figure 10 is a graph showing raw pressure response curves and various adjustments. The exponential fit serves as the best fit for both sensors tested. [0081] Even though the system 10 ideally uses the data from the pressure sensor 50 to verify the proper placement of the tester at the target tissue location 52, it is appreciated that in an alternative configuration the user can simply forgo monitoring of pressure and monitor it manually (eg, tactile sensation or simply placing the tester 16 in tissue site 52 under gravity). [0082] Referring to Figure 11, the user preferably interacts with the data acquisition and the control unit 40 by means of a PC 154 running the processing module 12 and the application module 14 comprising the graphical user interface 36 ( for example, LabVIEW or similar). In a preferred configuration, PC 154 communicates with data acquisition unit 40 via an Ethernet connection (not shown). Alternatively, PC 154 communicates with data acquisition unit 40 via a wireless connection (not shown) such as WIFI, Bluetooth, etc. Data files generated in data acquisition unit 40 can also be transferred to PC 154 via an FTP connection for temporary storage and further processing. [0083] With respect to the PC interface 154 shown in Figure 11, the individual LEDs 64 of the LED array 44 project light with wavelengths encoded to hemoglobin, and the photodiode sensors 62 measure the amount of light that passes through and is reflected of tissue 52. 
The data acquisition unit 40 generally comprises a digital TTL output 152 coupled to the LEDs 64 and an analog input 150 coupled to the photodiodes 62. The signal processing module 12 then further processes and filters this data, which is then transmitted to the graphical user interface 36 for further processing and viewing. The data can then be converted to a plethysmographic waveform for display.

[0084] Figure 12 shows an image 160 of the hardware configuration module interface 34. Inputs can be selected to adjust the parameters of the LED array 44 in fields 166, the voltage channel settings in fields 164, and the current channel settings in fields 162, in addition to other parameters such as the sampling period, the pressure sampling period, etc.

[0085] Figure 13 shows an image 170 of the graphical user interface 36, which also serves as a data management and exploration tool, allowing a user to easily read the perfusion sensors and observe the various signals. Image 170 shows the integration of the data captured from the blood oximetry sensors (photodiode array 46 and LED array 44), the pressure sensor 50, and the tracking/position data captured while scanning with the photodiode array 46 and the LED array 44. Image 170 shows a first window 172 that displays the plethysmographic waveform (2 seconds are shown in Figure 13), and a second window 174 showing the absolute x- and y-axis movement performed with the tester. The graphical user interface 36 is also capable of mapping this to the measured SpO2 data (e.g., by switching one of the display windows 172 and 174). Bar 176 to the right of image 170 is the gauge of pressure sensor 50 readings, showing approximately half of the maximum pressure being applied. Gauge 176 preferably shows how much pressure the user is applying versus the maximum measurable pressure in a color-coded bar (as more pressure is applied, the bar changes from blue to green to red). Pressure gauge 176 is preferably mapped to pressure values suitable for different locations.

[0086] In order to provide a more informative map of perfusion in a local region, interpolation of the blood oximeter data can be performed using the sensor tracking data. The oximeter's optical sensor 16 provides absolute SpO2 readings, showing the percentage of blood that is oxygenated. This information, when associated with where it was collected, can be used to generate a blood oxygenation map. In a preferred configuration, the LED array 44 used to generate the SpO2 readings is also used to determine location. However, it is appreciated that another optical sensor, e.g., a laser (not shown), can be used to obtain location readings independently of the LED SpO2 readings. In this configuration, a low-power laser (similar to that of a laser mouse) is used to image a small area at very short intervals and then detect how that image has shifted. This information is then converted into two-dimensional ‘X’ and ‘Y’ displacement and position measurements.

[0087] In a preferred configuration, the interpolation is performed using a Kriging algorithm and the data points are mapped using the oximeter sensor 16 to track the movement of the sensor 16 across the test area. Kriging is a linear least-squares interpolation method often used for spatially dependent information. Interpolation is used to fill in, with estimated values, blank spots that a scan might have missed. The interpolated data is compiled into a color-coded image and displayed to the user.
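The mapping and interpolation just described can be sketched briefly. The MATLAB fragment below uses synthetic scan-path data and the built-in griddata (natural-neighbor) interpolation as a simple stand-in for the Kriging algorithm of the preferred configuration; the variable names are illustrative only.

% Minimal sketch (synthetic data; griddata used as a stand-in for Kriging).
rng(0);
x    = cumsum(randn(200, 1));            % tracked sensor x positions (arbitrary units)
y    = cumsum(randn(200, 1));            % tracked sensor y positions
spo2 = 95 + 3 * sin(0.1 * (1:200)');     % SpO2 readings collected along the scan path
[xq, yq] = meshgrid(linspace(min(x), max(x), 100), linspace(min(y), max(y), 100));
perfMap = griddata(x, y, spo2, xq, yq, 'natural');   % interpolate onto a regular grid
imagesc(perfMap); axis image; colorbar;              % color-coded map for display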
The Kriging approach allows for precise, anisotropic interpolation of the raw data, which makes the final result much easier to visualize. An example of interpolation is shown in Figure 14. The movement of sensor 16 was generally one-dimensional in this example, resulting in a linear trend along the x-axis. This is due to the low variance of points in that direction (note the total displacement of approximately 40 in the X direction compared to 1400 in the Y direction).

[0088] In order to assist in the visualization of the collected blood oximetry data, the processing software 12 preferably includes a feature extraction module 28 that can detect markers in a photograph and then properly align and overlay the blood oximetry data 26 (see Figures 1, 17). In a preferred method, the feature extraction module 28 takes images (e.g., photographs of the scan site taken with a camera) and overlays the perfusion data directly over the location from which they were taken.

[0089] Figure 15 shows a schematic diagram of a marker pattern 200 used to test the feature extraction module 28. Figure 16 illustrates the configuration of figure 15 superimposed on an image 205. Three markers (202, 204 and 206) were used as boundary points for a particular scan area 208. A first marker 202 was used to determine the rotation angle of the image. A second marker 206 was used to determine the left edge (image position) of the image. A third marker 204 was used to determine the image width. The markers (202, 204 and 206) can be any color, but green is ideal as it is easily distinguishable from all skin tones. For a clear illustration of the feature extraction software, small green plastic boxes were used to represent the points 202, 204 and 206 (see Figure 16), and image 205 was quickly edited to put three of them in a likely pattern. With the exception of this manipulation, all other images were generated in real time by the software. A grid 208 was used as sample data, to more clearly illustrate what is being done by the tool.

[0090] In one configuration, a mobile app (not shown) can be used to facilitate the capture and integration of photographs into the processing software 12. The application also allows a user to quickly take a photograph with a mobile device (e.g., a smartphone or similar device) and automatically send it via Bluetooth for capture by the processing software 12. The photograph can then be integrated with the mapping system.

[0091] Figure 17 illustrates a block diagram of a method 220 for generating a mapped and interpolated perfusion image (for example, with the processing module 12). Example code for performing method 220 can be found in the Source Code Annex attached to this document. It is appreciated that the code provided is simply one example of how to carry out the methods of the present invention.

[0092] The data acquired from the data acquisition unit 40 (which can be stored on the server 32) is first extracted in step 222 (by means of processing scripts 24). This extracted data is then used to simultaneously extract location data, perfusion data and pressure data for each measurement point. The processing software 12 can simultaneously sample the location, perfusion and pressure readings (e.g., in the 3 Hz range) in order to create a combined set of pressure, position and blood-oxygen measurements at each interval.

[0093] In order to generate useful information and metrics from the raw data recorded by the perfusion module in step 228, several algorithms can be used.
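As a small illustration of the combined records described in paragraph [0092] above, each sampling interval can be thought of as one entry holding position, perfusion and pressure together. The MATLAB sketch below is purely illustrative; the field names are hypothetical and not taken from the disclosure.

% Hypothetical combined record for one ~3 Hz sampling interval:
sample.time_s    = 0.33;           % acquisition time of this interval, s
sample.pos_xy    = [12.4, -3.1];   % tracked sensor position (x, y)
sample.perfusion = 96.2;           % perfusion/oximetry value at that position
sample.pressure  = 0.42;           % applied pressure, normalized to the sensor maximum
scan(1) = sample;                  % successive intervals accumulate into a scan array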
[0094] In step 230, features are extracted from the data (for example, by means of the feature extraction module 28). Position data corresponding to the location of sensor 16 is then mapped in step 232. After the scan has been completed, the oximetry data is mapped in step 234 to the appropriate coordinates corresponding to the sensor position data obtained in step 232. In step 236, the mapped data is interpolated (e.g., using the Kriging algorithm shown in Figure 14). The interpolated data can be compiled into a color-coded image and displayed to the user, and/or the perfusion data can then be superimposed on a background image (e.g., image 205) of the scan site, as described for Figures 15 and 16.

[0095] On the perfusion side, RF noise filtering is then performed on the data extracted in step 224. The motion noise is then removed in step 226 to obtain the perfusion data in step 228. Steps 224 and 226 can be carried out by means of the filtering module 22.

[0096] In a preferred method, illustrated in Figure 18, heterodyning is used to help eliminate in-band noise. The data recorded while the LED arrays 44 are off are subtracted from the adjacent data recorded while the LED arrays 44 are on (the subtraction method). This introduces high-frequency noise but removes low-frequency in-band noise, which is the bigger issue. The additional high-frequency noise that is introduced is then filtered out by a low-pass filter. The algorithms are configurable to allow for the preservation of high-frequency information from the PPG signals.

[0097] As illustrated in Figure 18, the relevant noise information from the areas marked 1 and 2 is used to calculate the noise that appears in area 3. This can be done by the one-way method or by the two-way method.

[0098] For the one-way method, only the preceding noise information from area 1 is used, and it is assumed that the relevant noise level is the same in areas 1 and 3. For the two-way method, the average of the noise from areas 1 and 2 is calculated. Finally, the noise in area 3 can also be estimated by interpolation, using data from all available noise periods preceding and following the target data point (3). The measurement data is averaged over these areas to generate a single point for each pulse of the LED 64. The result is then passed through a low-pass filter at the end to remove high-frequency noise.

[0099] Figure 19 is a graph of the theoretical response of the subtraction method of Figure 18 in relation to noise frequency and correction, determined by adding sinusoidal noise over a wide frequency range to a square-wave signal, applying the noise cancellation (correction) method, and measuring the proportion of the remaining noise relative to the original noise. Measurements were averaged across all phases for a given frequency. Figure 20 is a graph of the frequency response of the subtraction method shown on a dB scale.

[00100] For the frequency response graphs shown in Figures 19 and 20, the frequency is normalized to the frequency of the simulated LED drive signal, with 1 meaning the noise is at the same frequency as the drive signal, 2 meaning it is at twice the drive frequency, and so on.

[00101] Figures 21 and 22 are graphs showing the plethysmographic signals extracted by employing the above-mentioned noise cancellation (subtraction) method of Figure 18 on a high-frequency LED drive signal, compared to the scenario in which the noise cancellation technique is not performed.
Figure 21 shows the results of employing noise subtraction on a high-frequency LED drive signal, with averaging over several LED drive periods to obtain data rates similar to those above. Note the successful noise reduction at approximately 1.5 s. Figure 22 is an enlarged version of Figure 21, showing the noise peak that is removed by differential noise subtraction. These graphs show that the noise subtraction method of the present invention is effective in removing in-band noise.

[00102] Frequency-domain analyses/experiments were performed on the frequency-domain signals from the plethysmographic measurements. The experiments revealed not only high-magnitude components at the heart rate, but also at its harmonics. This appears reasonably consistent across locations.

[00103] In order to verify that the harmonics shown in the frequency domain were not the result of noise or instability, but represented real components of the pulse waveform, a synthetic sine-wave signal was constructed. The signal was created by adding sine waves at the frequency of each separate pulse-waveform peak. This superposition was intended to model the effects of frequency instability on the waveform while removing any frequency components due to the shape of the pulse waveform.

[00104] A comparison of the signals is shown in Figures 23 and 24. Figure 23 shows a sample of the time-domain signals used for the comparison. Neck measurements were compared to thumb measurements at equal pressure. Figure 24 shows the frequency-domain representation of the measured signals. Note the second harmonic at 128 BPM (2.13 Hz), the third harmonic at 207 BPM (3.45 Hz), etc. The results demonstrate that these harmonics are indeed intrinsic to the pulse waveform and are not the result of noise or frequency instability.

[00105] Experiments were performed on various sites of the body, including the neck, thumb and forehead, using the perfusion system 10 of the present invention. Samples of the extracted plethysmographic signals are reported in Figures 25-27, which clearly show that the perfusion system successfully removes motion and ambient noise and extracts the plethysmographic signal from different locations on the body.

[00106] Figure 25 shows the plethysmographic signals extracted from the forehead. Pressure values are displayed in terms of the resistance measured using the pressure sensor; lower resistances indicate higher applied pressures.

[00107] Figure 26 shows a comparison of plethysmographic signal readings taken from the underside of the thumb joint. All factors except pressure were held constant between measurements. Moderate pressure clearly results in a better waveform.

[00108] Figure 27 shows the results of variable pressure using the neck reflectance sensor. These experiments show the importance of integrating and fusing the applied pressure with the perfusion signal in this system, since the pressure with which the sensor array is applied to the target tissue has the greatest impact on the perfusion readings, as shown in the figures. It appears that the neck and thumb do best when moderate pressure (0.15 MΩ to 70 kΩ) is applied, while the forehead does better with low pressure (above 0.15 MΩ). This may be a result of the neck and thumb being softer tissue than the forehead.

[00109] The perfusion system 10 was also tested on black tape as a means of marking tissue sites. Black tape was used as a test skin marker. The sensor was used to measure the signals on the tape, and right next to it.
A skin impression can be seen where the reflectance sensor has been used outside the tape. [00110] Figure 28 shows the results both above and to the side of the black tape. The results show that the use of a simple piece of black tape is efficient in causing large differences in the signal and can, therefore, be used as a marker for specific places in the body. [00111] Features of the present invention can be described with reference to flowchart illustrations of methods and systems according to embodiments of the invention, and/or algorithms, formulas, or other computational descriptions, which can also be implemented as products of computer programs . In this regard, each block or step of a flowchart, and combinations of blocks (and/or steps) of a flowchart, algorithm, formula or computational description can be implemented by various means such as hardware, firmware and/or software including one or more computer program instructions embedded in the logic of computer-readable program code. As will be appreciated, any such computer program instructions can be loaded into a computer, including without limitation a general purpose computer or a special purpose computer, or other programmable processing devices to produce a machine such that the instructions of the computer programs that run on the computer or other programmable processing devices create a means to implement the functions specified in the block(s) of the flowchart(s). [00112] Consequently, blocks of flowcharts, algorithms, formulas or computational descriptions support combinations of means to carry out the specified functions, combinations of steps to carry out the specified functions, and computer program instructions, as incorporated in logical means of codes of computer-readable programs to perform the specified functions. It will also be understood that each block of flowchart illustrations, algorithms, formulas or computational descriptions and combinations thereof described herein may be implemented by computer systems based on special purpose hardware that perform the specified functions or steps, or combinations of hardware of special purpose and computer readable program code logic means. [00113] In addition, these computer program instructions, as embedded in the logic of the computer-readable program code, may also be stored in a computer-readable memory that can trigger a computer or other programmable processing device to function in a particular way. way, so that the instructions stored in memory produce an article of manufacture including instruction means that implement the function specified in the block(s) of the flowchart(s). Computer program instructions can also be loaded into a computer or other programmable processing device to cause a series of operational steps to be performed on the computer or other programmable processing device to produce a computer-implemented process such that the instructions performed on the computer or other programmable processing device provide steps to implement the functions specified in the block(s) of the flowchart(s), algorithm(s), formula(s) or computational description(s). is). [00114] From the above discussion, it will be appreciated that the invention may be incorporated in a number of ways, including the following: [00115] 1. 
A device for monitoring perfusion oxygenation of a target tissue region of a patient, comprising: a tester comprising: an array of planar sensors; the array of sensors configured to be positioned in contact with the surface of the target tissue region; the sensor array comprising one or more LEDs configured to emit light in the tissue-target region at a hemoglobin-encoded wavelength; the array of sensors comprising one or more photodiodes configured to detect light reflected from the LEDs; and a data acquisition controller coupled to one or more LEDs and one or more photodiodes to control the emission and reception of light from the sensor array to obtain perfusion oxygenation data associated with the tissue target region. [00116] 2. The device of configuration 1, the tester further comprising: a pressure sensor coupled to the sensor array; the pressure sensor configured to obtain pressure readings from the sensor array contacting a surface of the target tissue region; where the tester is configured to take pressure sensor readings by obtaining perfusion oxygenation data to ensure proper tester contact with the surface of the target tissue region. [00117] 3. The device of configuration 2: where the pressure sensors and the sensor array are connected to a first side of a printed circuit board (PCB); and where the data acquisition controller is connected to the PCB on a second side opposite said first side. [00118] 4. The device of configuration 1, where each LED comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light. [00119] 5. The device of configuration 4: where one or more of the LEDs are coupled to the lead circuit; and where the lead circuit is configured to allow the red LED and the infrared LED to be independently driven sharing a common anode. [00120] 6. The device of configuration 5, where the conducting circuit comprises an amplifier; and a field effect transmitter configured to provide negative feedback. [00121] 7. The device of configuration 2, further comprising: a processing module coupled to the data acquisition controller; the processing module configured to control the sampling of the pressure sensor and sensor array for simultaneous acquisition of pressure sensor data and perfusion oxygenation data. [00122] 8. The device of configuration 7, where the processing module is configured to obtain readings from the sensor array to obtain data from the position of the verifier. [00123] 9. The device of configuration 8, where the processing module is configured to generate a map of the oxygenation of the target tissue perfusion. [00124] 10. The device of configuration 8, where the processing module is configured to control the mapping of the pressure sensor and sensor array for simultaneous acquisition of two or more data parameters selected from the group consisting of sensor data pressure, perfusion oxygenation data and position data to simultaneously display said two or more data parameters. [00125] 11. 
A system for monitoring perfusion oxygenation of a target tissue region of a patient, comprising: (a) a tester comprising: a planar array of sensors; the array of sensors configured to be positioned in contact with the surface of the target tissue region; the sensor array comprising one or more light sources configured to emit light in the target tissue region at a hemoglobin encoded wavelength; the sensor array comprising one or more sensors configured to detect light reflected from the light sources; a pressure sensor coupled to the sensor array; the pressure sensor configured to obtain pressure readings from the contact of the sensor array with the surface of the target tissue region; and (b) a data acquisition controller coupled to one or more sensors and for controlling the emission and reception of light from the sensor array to obtain perfusion oxygenation data associated with the target tissue; and (c) a processing module coupled to the data acquisition controller; (d) the processing module configured to control the sampling of the pressure sensor and the arrangement of sensors for simultaneous acquisition of perfusion oxygenation data and pressure sensor data to ensure proper contact of the tester with the surface of the tissue region -target. [00126] 12. The configuration 11 system, wherein the array of sensors comprises one or more LEDs configured to emit light in the target tissue region at a wavelength coded for hemoglobin; and where the sensor array comprises one or more photodiodes configured to detect light reflected from the LEDs. [00127] 13. The configuration 12 system: where each of one or more LEDs comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light; where one or more LEDs are coupled to the conductive circuit; and where the lead circuit is configured to allow the red LED and the infrared LED to be independently driven sharing a common anode. [00128] 14. The configuration 11 system, further comprising: a graphical user interface; where the graphical user interface is configured to display perfusion oxygenation data and pressure sensor data. [00129] 15. The system of configuration 14, where the processing module is further configured to obtain readings from the sensor array to obtain data from the position of the verifier. [00130] 16. The system of configuration 15, where the processing module is additionally configured to interpolate the position data to generate a map of the oxygenation of the target tissue perfusion. [00131] 17. The configuration 16 system, where the processing module is configured to control the sampling of the pressure sensor and the sensor array for simultaneous acquisition of two or more data parameters selected from the group consisting of sensor data pressure, perfusion oxygenation data and position data to simultaneously display said two or more data parameters. [00132] 18. The system of configuration 16, where the processing module is configured to receive an image of the target tissue, and overlay the perfusion oxygenation map over the image. [00133] 19. The configuration 14 system, where the graphical user interface is configured to allow user input to manipulate the sensor array and pressure sensor settings. [00134] 20. 
The configuration 11 system, where the processing module further comprises: a filtering module; the filter module configured to filter in-band noise by subtracting data recorded when one or more light sources are in an “off” state from the data recorded when one or more light sources are in an “on” state. [00135] 21. A method for real-time monitoring of perfusion oxygenation of a target tissue region of a patient, comprising: positioning an array of sensors in contact with the surface of the target tissue region; emitting light from the light sources of the sensor array in the tissue-target region at a hemoglobin-encoded wavelength; receiving light reflected from light sources; obtaining the pressure data associated with the contact of the sensor array with the surface of the target tissue region; obtaining perfusion oxygenation data associated with the target tissue region; and sampling perfusion oxygenation data and pressure data to ensure proper contact of the sensor array with the surface of the target tissue region. [00136] 22. The method as mentioned in configuration 21: wherein the array of sensors comprises one or more LEDs configured to emit light in the target tissue region at a wavelength encoded for hemoglobin; and where the sensor array comprises one or more photodiodes configured to detect light reflected from the LEDs. [00137] 23. A method as mentioned in configuration 22: where each of the one or more LEDs comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light; the method further comprising the independent drive of the red LED and the infrared LED while the red LED and the infrared LED share a common anode. [00138] 24. A method as mentioned in embodiment 21, further comprising: simultaneous display of perfusion oxygenation data and pressure sensor data. [00139] 25. A method as mentioned in configuration 21, further comprising: acquiring readings from the sensor array to obtain data from the position of the verifier. [00140] 26. A method as mentioned in embodiment 25, further comprising: interpolating the position data to generate a map of target tissue perfusion oxygenation. [00141] 27. A method as mentioned in configuration 26, where the interpolation of the position data comprises the configuration of a Kriging algorithm on the acquired position data. [00142] 28. A method as mentioned in embodiment 26, further comprising: sampling the pressure sensor and the sensor array for simultaneous acquisition of pressure sensor data, perfusion oxygenation data, and position data; and simultaneous display of pressure sensor data, perfusion oxygenation data, and position data. [00143] 29. A method as mentioned in embodiment 26, further comprising: receiving an image of the target tissue; and overlaying the perfusion oxygenation map on the image. [00144] 30. A method as mentioned in configuration 21, further comprising: presenting a graphical user interface to allow user input; and manipulating the sampling settings of the sensor array and the user sensor. pressure according to said user input. [00145] 31. A method as mentioned in configuration 21, further comprising: cycling one or more light sources between a period when one or more light sources are on, and a period when one or more light sources are on a “off” state; and in-band noise filtering by subtracting recorded data when one or more light sources are off from when one or more light sources are in the “on” state. 
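Configuration 31 above, like paragraphs [0096]-[0098], describes cycling the light sources and subtracting the data recorded while they are off from the adjacent data recorded while they are on. The MATLAB fragment below is a minimal sketch of that one-sided subtraction on synthetic data, with illustrative timing values; it is a simplified companion to, not a substitute for, the Source Code Annex that follows (and, like the annex, uses fir1/filtfilt from the Signal Processing Toolbox).

% Minimal sketch of one-sided in-band noise subtraction (synthetic data, illustrative timing).
fs     = 10e3;                          % sampling rate, Hz
period = 5e-3;                          % LED drive period, s
duty   = 2.5e-3;                        % LED on-time per period, s
t      = (0:1/fs:2 - 1/fs)';            % 2 s of samples
ledOn  = mod(t, period) < duty;         % LED drive gate
% Synthetic photodiode signal: PPG-modulated LED light plus ambient interference and noise.
pd = ledOn .* (1 + 0.05*sin(2*pi*1.2*t)) + 0.2*sin(2*pi*60*t) + 0.01*randn(size(t));
perLen = round(period * fs);            % samples per drive period
onLen  = round(duty * fs);              % samples with the LED on per period
nPer   = floor(numel(pd) / perLen);
sig    = zeros(nPer, 1);
for k = 1:nPer
    base   = (k-1) * perLen;                                        % first sample of this period
    sig(k) = mean(pd(base + (1:onLen))) ...                         % average of the "on" window
           - mean(pd(base + (onLen+1:perLen)));                     % minus the adjacent "off" window
end
sig = filtfilt(fir1(50, 0.1), 1, sig);   % low-pass the per-period series (10 Hz at a 200 Hz rate)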
[00146] Although the above description contains many details, these should not be construed as limiting the scope of the invention, but simply as offering illustrations of some of the presently preferred embodiments of this invention. Therefore, it will be appreciated that the scope of the present invention fully encompasses other configurations that may become obvious to those skilled in the art, and that the scope of the present invention should therefore be limited by nothing other than the appended Claims, in which a reference to a given element is not intended to mean "one and only one" unless explicitly stated in that way, but rather "one or more". All structural, chemical and functional equivalents to the elements of the preferred configuration described above that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be covered by the present Claims. Furthermore, it is not necessary for a device or method to address each and every problem sought to be solved by the present invention in order to be covered by the present Claims. Furthermore, no element, component or method step of this disclosure is intended to be dedicated to the public regardless of whether the element, component or method step is explicitly mentioned in the Claims. No Claim element shall be construed under the provisions of 35 U.S.C. 112, paragraph six, unless the element is expressly recited using the phrase "means for".

[00147] SOURCE CODE ANNEX

[00148] The source code below is submitted as an example and not as a limitation, as one signal processing configuration of the present invention. Those skilled in the art will readily appreciate that the signal processing can be performed in various other ways, which would be readily understood from the description presented here, and that the signal processing methods are not limited to those illustrated in the source code listed below.

% clear all; clc;
% % % Detect Heart Rate, Perfusion & SpO2 % % %
%% Input File
% Perfusion = zeros(52,1);
% for 11 = 0:51
% inputfile = strcat('3.2_s=10k_t=3s_p=5000u_duty=2500u_Richard_two_sensors_volorarm_ch0=min=offset=2500um_volar_arm_chl=lcmCTtoCT=offset=0_', num2str(11));
inputfile = 'gen3gen3r 10';
samplingRate = 10e3;    % Sampling Rate in Hz
period = 5e-3;          % Period in s
duty = 2.5e-3;          % Duty Cycle in s
totalTime = 10;         % Total File Time in s
offsetR = 2.5e-3;       % Red light offset in s
offsetIR = 0e-3;        % IR light offset in s
transTime = 1.2e-4;     % Rise/Fall time in s
%% Heuristics for Peak Detection & Blood Oximetry
RED_sens = 0.42;        % Photodiode sensitivity @ 660nm in A/W
IR_sens = 0.61;         % Photodiode sensitivity @ 880nm in A/W
MAX_HEART_RATE = 220;
MIN_SAMP = 1/((period*5)*MAX_HEART_RATE/60);   % Fastest heart rate allowed
%% Read Input File into Matlab
sensorselect = 3;
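% The block that follows reads the four recorded channels with textread and, depending on
% sensorselect, assigns the selected photodiode channel to PD1 and the LED drive signal
% to PD2 (see the inline channel comments). PD1 is then inverted (PD1 = -PD1) before
% the noise-cancellation (subtraction) stages below.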
if sensorselect==l %5mm [PD1, PD2, PD3, PD4]=textread(inputfile, ‘%f%f%f%f%*[An]’, ‘delimiter’,’,’); %PD1 -> central photodiode (Channel 0); PD2 -> Drivesignal (Channel 1) elseif sensorselect==2 %10mm [PD2, PDl, PD3, PD4]=textread(inputfile, '%f%f%f%f%*[An]', 'delimiter' ,','); %PD1 -> photodiode panel (Channel 0); PD2 -> Drive signal (Channel 1) elseif sensorselect==3 ' [PD2, PD3, PD1, PD4]=textread(inputfile, '%f%f%f%f%*[An]', 'delimiter' ,','); %PD1 -> photodiode panel (Channel 0); PD2 -> Drive 'signal (Channel 1) elseif sensorselect==4 [PD2, PD3, PD4, PD1]=textread(inputfile, '%f%f%f%f%*[An]', 'delimiter' ,',') % PD1 -> central photodiode (Channel 0); PD2 -> Drive signal (Channel 1) end PD1=-PD1; ‘ % if trial==3 % PD1 = PD1 (length(PD1)/2+1:end); . % end % Data=DownloadFromDB(); %PD1 = Date(1:end,1); %PD2 = Date(1:end,2); No_RIR_Waves = totalTime/period; % Total # of RED+IR square waves %% Noise Cancellation %% % % % 1. single-sided subtraction %% % averageRed = zeros (No_RlR_Waves, 1); averageRedStep1=zeros(No_RIR_Waves, 1); averageRedStep2=zeros(No_RIR_Waves, 1); averageIR = zeros(No_RIR_Waves, 1); averageNoise_1 = zeros(No_RIR_Waves, 1); % 1st off portion in each period averageNoise_2 = zeros(No_RIR_Waves, 1); % 2nd off portion for i=0:No_RIR_ Waves-1 for j=1:(duty-transTime)*samplingRate % Average every period averageRed(i+1, 1) = averageRed(i + 1, 1) + PD1(ceil (i*period*samplingRate+j+offsetR*samplingRate+transT ime*samplingRate)); %averageIR(i+1, 1) = averageIR(i+1, 1) + PD1(floor(i*period*samplingRate+j +offsetIR*samplingRate+tran sTime*samplingRate)); end % for j=1:(duty/2)*samplingRate % Average every period, in transition time because LED is already on, changes are very short % averageRedStep1(i+1, 1) = averageRed(i+1, 1) + PD1(ceil(i*period*samplingRate+j+offset*samplingRate+transTi me*samplingRate)); . % averageRedStep2(i+1, 1) = averageRed(i+1, 1) + PD1(ceil(i*period*samplingRate+j+offsetR*samplingRate+transT ime*samplingRate+floor((duty/2)*samplingRate) )); % %averageIR(i+1, 1) = averageIR(i+1, 1)+ PD1(floor(i*period*samplingRate+j + offsetIR*samplingRate+tran sTime*samplingRate)); % end for j=1:(period-duty-transTime)*samplingRate %Averaging the off portion for noise subtraction % averageNoise_1(i+1, 1) = averageNoise_1(i+1, 1) + PD1(floor(i*period) *samplingRate+j+transTime*samplingRate));averageN oise_1(i+1, 1) = averageNoise_1(i+1, 1) + PD1 (max (2, floor (i*period*samplingRate+j +transTime*samplingRa te) -(period-duty-offsetR-transTime)*samplingRate))); %averageNoise_2(i + 1, 1) = averageNoise_2(i+1, 1) + PD1(floor(i*period*samplingRate+j+(offsetR+duty)*samplingRat e)); end averageRed(i+1, 1) averageRed (i+1, 1) /floor ((duty transTime)* samplingRate); %averageiR(i+1, 1) averageiR(i+1, 1)/((dutytransTime)* samplingRate); % averageRedStep1 (i+1, 1) = averageRedStep1 (i+1, 1) /floor((duty/2)*samplingRate); % averageRedStep2(i+1, 1) = averageRedStep2(i+1, 1)/floor ((duty/2)*samplingRate); averageNoise_1(i+1, 1) = averageNoise 1(i+1, 1)/floor((period-dutytrans Time)* samplingRate); % Use period/2 when using both red and IR %averageNoise_2(i+1, 1) = averageNoise_2(i+1, 1)/ ((period/2-dutytransTime)* samplingRate); end , averageRed_1 =averageRed-averageNoise_1; averageRed_step =averageRedStep2-averageRedStep1; %averageIR_1 = averageIR - averageNoise_2; averageRed_4 = zeros(No_RIR_Waves/5, 1); , averageIR_4 = zeros(No_RIR_Waves/5, 1); for i=1: (No_RIR_Waves/5) . 
for j=1:5 averageRed_4(i) = averageRed_4(i)+averageRed_1((i-1)*5+j); % averageIR_4(i) = averageIR_4(i)+averageIR_1((i-1)*5+j); end averageRed_4(i) = averageRed_4(i)/5; % averageIR_4(i) = averageIR_4(i)/5; end % % % % %2. double-sided subtraction % % % averageNoise_Red = (averageNoise_1 + averageNoise_2) ./2; % Average the off portion on two sides of one on portion averageNoise_IR = (averageNoise_ 1(2:end) + averageNoise_2(1:end1)) ./2; averageIR_2 = zeros(No_RIR_Waves, 1); ■ averageRed_2 = averageRed - averageNoise_Red; - averageIR_2(1:end-1) = averageIR(1:end-1) - averageNoise IR; averageIR_2(end) = averageIR(end) - averageNoise_2(end); % Last period of IR uses single-sided subtraction % % % % 3. interpolation subtraction % % % % Noise_raw = zeros(totalTime * samplingRate, 1); % Store the low-pass-filtered off portion continuously %x_Noise = zeros (floor(offsetR*samplingRatetransTime* sam plingRate)+floor(offsetIR*samplingRate(offsetR*samplingRate+ (duty+transTime)*samplingRate))*No_RIR_Waves,1); % coordinates of Noise_raw % x_Noise_x = 0; % Noise_raw_0 = zeros(totalTime * SamplingRate, 1); % % for i=0:No_RIR_ Waves-1 % for j=1:period*samplingRate % if (((j<=offsetR*samplingRate)&&j>transTime*samplingRate)) | ((j> (offsetR*samplingRate + (duty+transTime)*samplingRate)) && (j <=offsetIR*samplingRate))) % load off portion to Noise_raw % Noise_raw_0(floor(i*period*samplingRate+j)) = PD1 (floor(i *period*samplingRate+j)); % end % end % end % % order = 50; % Pre-low pass filter for spline interpolation % cutoff = 200/samplingRate; % Cut off frequency = 100 Hz % yl = firl(order, cutoff,’low’); %PD1_LPF = filtfilt(y 1,1,Noise_raw_0); % % for i=0:No_RIR_Waves-1 % for j=1:period*samplingRate % if (((j<=offsetR*samplingRate)& (j>transTime* amplingRate)) | ((j> (offsetR*samplingRate)) + (duty+transTime)*samplingRate)) && (j <= offsetIR*samplingRate))) % load off portion to Noise_raw % x_Noise_x = x_Noise_x +1; % Noise_raw(x_Noise_x) = PD1_LPF(floor(i *period*samplingRate+j)); % % x_Noise(x_Noise_x) = floor(i*period*samplingRate+j); end % end % end % % % Noise = interpl(x_Noise,Noise_raw(1:x_Noise_x),1:samplingRate*totalT ime,’spline ‘); % Noise interpolation % PD_N = PD1 - Noise’; % % % averageRed_3_1 = zeros(No_RIR_Waves, 1); % averageIR_3_1 = zeros(No_RIR_Waves, 1); % for i=0:No_RIR_Waves-1 % Average data in each square wave period % for j=1:floor((duty-transTime)*samplingRate) % averageRed_3_1(i+1, 1) = averageRed_3_1(i+1, 1 ) + PD_N(floor(i*period*samplingRate+j+offsetR*samplingRate+tran sTime*samplingRate)); % averageIR_3_1 (i+1, 1) =.averageIR_3_1(i+1, 1) + PD_N(floor(i*period*samplingRate+ j + offsetIR*samplingRate+tra nsTime*samplingRate)); % end % averageRed_3_1(i+1, 1) = averageRed_3_1(i+1, 1)/(floor((dutytransTime)* samplingRate)) % averageIR_3_1(i+1, 1) = averageIR_3_1(i+1, 1)/ (floor((dutytransTime)* samplingRate)); % averageiR_3_1(i+1, 1) averageiR_3_1(i+1, 1)/(floor((dutytransTime)* samplingRate)); % end % averageiR_3 1(end) averageiR_3 _1 (end-1); % Abandon the last one of IR 3 to eliminate error caused by interpolation %% Create a Low-pass and Filter Waveforms averageRed 1; g. 
o 1, 2, 3, 4 correspond to single-sided subtraction, double-sided subtraction, interpolation subtraction & average of every 5 points averageIR = zeros(length(averageRed_1),1); order = 100; cutoff = 10/(1/period); y = firl(order, cutoff,’low’); x = filterfilt(y, 1, averageRed); z = filterfilt(y, 1, averageIR); [dec,lib] = wavedec(averageRed,2,’dblO’); a2 = wrcoef(‘a’,dec,lib,’dblO’,2); %Perfusion(11+1) = mean(x); %end %% End of Loop % % Pre-LPF for interpolation % % order = 100; % % cutoffl = 40 /(1/period); % % yl = firl(order, cutoff1,’low’); % % xl = filtfilt(y1, 1, averageRed); % % zl = filtfilt(y1, 1, averageIR); % % % % freqz(y) % view filter % % numvg = 100; runavg = ones(l, numvg)/numavg; x_avg = filtfilt(runavg, 1, averageRed); z_avg = filtfilt(runavg, 1, averageIR); %x = x - x_avg; %z = z - z_avg; time = (1:No_RIR_Waves)/(No_RlR_Waves)*totalTime % % % Red LED % % figure; subplot(2, 1, 1) hold on; plot(time, averageRed*lE3, ‘-k’, ‘linewidth’, 2) plot(time, x*1E3, ‘-r’, linewidth’, 2); plot(time, x_avg*1E3, ‘-b’, ‘linewidth’, 2); hold off; ylabel('Recived Signal [mV]', 'fontsize', 14, 'fontweight', 'bold') xlabel('Time, [s]', 'fontsize', 14, 'fontweight', 'bold') set( gca, 'linewidth', 2, 'fontsize', 10, 'fontweight', 'bold') legend('Red LED', 'Red LED (LPF)', 'Running, Average', 'Orientation','horizontal' ) title('Red LED','fontsize', 14, 'fontweight', 'bold') box on; heart_beat_RED = x-x_avg; wavelet_RED = a2-smooth(a2,200); %heart_beat_RED = wavelet_RED; % % Detect Heat Beat Peaks FAIL 202C VERSION ‘ % temp = sign(diff(heart_beat_RED)); % % temp = sign(diff(x(order+numavg/2:end-numavg/2-1))); % temp2 = (temp(1:end-1)-temp(2:end))./2; % loc = find(temp2 ~=0); %loc = [loc(1); loc(find(diff(loc) > MIN_SAMP/2) +1) ]; % peaks1 = loc(find(temp2(loc) > 0))+1; % peaks1 = peaks1(find(heart_beat_RED(peaks1) >0)); % valleys1 = loc(find(temp2(loc) < 0))+1; % valleys1 = valleys1(find(heart_beat_RED(valleys1) < 0)); %peak detection that actually works: peaks-[]; widthp=50; for j = 1:totalTime/period if heart_beat_RED(j)==max(heart_beat_RED(max(1,jwidthp): min(totalTime/period,j+widthp))) peaks(end+1)=j; end end valleys=[]; widthv=50; for j = 1:totalTime/period if heart_beat_RED(j)==min(heart_beat_RED(max(1,jwidthv): min(totalTime/period,j+widthv))) valleys(end+1)=j; end end diffzs=[]; widthd=25; diff_hb = diff(heart_beat_RED); for j = 1:totalTime/period-1 if abs(diff_hb(j))==min(abs(diff_hb(max(1,jwidthd): min(totalTime/period-1,j+widthd))))) diffzs( end+1)=j; end end killthese=[]; for j=1:numel(diffzs) for k=1:numel(peaks) if abs(diffzs(j)-peaks(k))<25 killthese(end+1)=j; end for k=1:numel(valleys) if abs(diffzs(j)-valleys(k))<25 killthese(end+1)=j; end end end peakspacing(j) = min (abs (diffzs (j)-peaks)); valleyspacing(j)= min(abs(diffzs(j)-valleys)); end diffzs(killthese)=[]; peakspacing(killthese)=[]; %clean up peaks/valleys to make them match 1:1 delp=[]; for i =1:length(peaks)-1 valid=0; for j = 1:length(valleys) if peaks(i+1)>valleys(j) && peaks(i)cvalleys(j) valid=l; break end end if ‘ valid==0 && heart_beat_RED(peaks(i + 1))<heart_beat RED(peaks(i)) delp(end+1)=i+1; elseif valid==0 delp(end+1)=i; end end ‘ peaks (delp) = [ ]; delv=[]; for i = 1:length(valleys)-1 valid=0; for j = 1:length(peaks) if valleys(i + 1)>peaks(j) && valleys(i)<peaks(j) valid=1; break end end if valid==0 && heart_beat_RED(valleys(i+1))>heart_beat_RED(valleys(i)) delv(end+1)=i + l; elseif valid==0 delv(end+1)=i; end end valleys(delv)=[]; %finish of cleanup mdiffzs = median(heart_beat_RED(diffzs)); mpeaks = 
median(heart_beat_RED(peaks)); mvalleys = median(heart_beat_RED(valleys)); secondpeak = (mdiffzs-mvalleys)/(mpeaks-mvalleys); peakspacing = median(peakspacing); valleyspacing = median(valleyspacing); subplot(2, 1, 2) hold on; plot(time, heart_beat_RED* 1E3, ‘-k’, ‘linewidth’, 2); % ylim([-1.5 1.5]) plot(time(peaks), heart_beat_RED(peaks)*1E3, ‘or’, ‘linewidth’, 2, ‘markersize’, 12); plot(time(valleys), heart_beat RED(valleys)*1E3, ‘ob’, ‘linewidth’, 2, markerssize’, 12); plot(time(diffzs), heart beat RED(diffzs)*1E3, ‘og’, ‘linewidth’, 2, ‘markersize’, 12); hold off; ylabel('Heart Beat [mV] ', 'fontsize', 14, 'fontweight', 'bold') xlabel('Time [s]', 'fontsize', 14, 'fontweight', 'bold') set(gca ,'linewidth', 2, 'fontsize', 10, 'fontweight', 'bold') box on; Heart_Rate_RED = length(peaks)/(time(end)-time(1))*60; %% % % % IR LED %% % % figure; % subplot(2, 1, 1) % hold on; % plot(time, averageIR* 1E3, ‘ - k’ ‘ ‘linewidth’, 2); % plot(time, z*1E3, ‘-r’, ‘linewidth’, 2); % plot(time, z_avg*lE3, ‘-b’, ‘linewidth’, 2); % hold off; % ylabel ('Recived Signal [mV]', 'fontsize', 14, 'fontweight', 'bold') % xlabel('Time [s] ', 'fontsize', 14, 'fontweight', 'bold') % set(gca,'linewidth', 2, 'fontsize', 10, 'fontweight', 'bold') % legend('IR LED', 'IR LED (LPF)', 'Running Average', 'Orientation', ' horizontal') % title('IR LED', 'fontsize', 14, 'fontweight', 'bold') % box on; % % heart_beat_IR = z-z_avg; % % % Detect Heat Beat Peaks % temp = sign(diff(heart_beat_IR)); % % temp = sign(diff(z(order+numavg/2:end-numavg/2-1))); % temp2 = (temp(1:end-1)-temp(2:end))./2; % loc = find(temp2 ~= 0); %loc = [loc(1); loc(find(diff(loc) > MIN_SAMP/2)+1)]; % peaks2 = loc(find(temp2(loc) > 0))+1; % peaks2 = peaks2(find(heart_beat_IR(peaks2) > 0)); % valleys2 = loc(find(temp2(loc) < 0))+1; % valleys2 = valleys2(find(heart_beat_IR(valleys2) < 0)); % % subplot(2,1,2) % hold on; % plot(time, heart_beat_IR*lE3, ‘-k’, ‘linewidth’, 2); %ylim([-1.5 1.5]); % plot(time(peaks2), heart_beat_IR(peaks2)*1E3, ‘or’, ‘linewidth’, 2, ‘markersize’, 12); % plot(time(valleys2), heart_beat_IR(valleys2)*1E3, ‘ob’, ‘linewidth’, 2, ‘markersize’, 12); % hold off; % ylabel('Heart Beat [mV]', 'fontsize', 14,'fontweight', 'bold') % xlabel('Time [s]', 'fontsize', 14, 'fontweight', 'bold') % set(gca,'linewidth', 2, 'fontsize', 10, 'fontweight', 'bold') % box on; % Heart_Rate_IR = length(peaks2)/ (time (end)-time (l))*60 % % % % % SpO2 % % % % H_heart_beat_Red_peak = interp1(peaks1,x(peaks1),1:length(time),' spline'); Interpolate the peak value of heart beat (RED) for whole time range % H_heart_beat_IR_peak = Interp1(peaks2,z(peaks2),1:length(time),’spline’); Interpolate the peak value of heart beat (IR) for whole time range % % H_heart_beat_Red_valley = Interp1 (valleys1,x(valleys1),1:length(time), ‘spline’); % Interpolate the valley value of heart beat (RED) for whole time, range % H_heart_beat_IR_valley = Interp1(valleys2,z(valleys2),1:length(time), 'spline');' Interpolate the valley value of heart beat (IR) ) for whole time range % % % Superposition % x2 = zeros(length(x1),1); %z2 = zeros(length(z1),1); % for i=2:length(peaks1)-1 % x2 (1:end-(peaks1(i)-peaks1(2)))= x2(1:end-(peaks1(i)-peaks1(2))) + xl(peaksl(i)-peaksl(2)+1:end); % z2(1:end-(peaks2(i)-peaks2(2))) = z2(1:end-(peaks2(i)-peaks2(2))) + zl(peaks2(i)-peaks2(2) +1:end); % end % x2 = x2/(length(peaksl)-2); %z2 = z2/(length(peaks2)-2); % % % % H_heart_beat_Red = filtfilt(runavg, 1, H_heart_beat_Red); % % H_heart_beat_IR = filtfilt(runavg, 1, H_heart_beat_IR); % % % R_red = 
H_heart_beat_Red_valley./(H_heart_beat_Red_peak); % R_IR = H_heart_beat_IR_valley./(H_heart_beat_IR_peak); % % R = (log(R_red)./log(R_IR))*(RED_sens/IR_sens); % 02 = (0.81-0.18.*R)./(O.63+0.1 l.*R)*100; % SpO2 = mean(02) % % figure; % hold on; % plot(time, 02, ‘-r’, ‘linewidth’, 2); % ylabel('SpO2', 'fontsize', 14, 'fontweight', 'bold') % xlabel('Time [s]', 'fontsize', 14, 'fontweight', 'bold') % set(gca, 'linewidth', 2, 'fontsize', 10, 'fontweight' 'bold') % ylim([90 110]) % box on; x=[]; hrdata=[]; pdiff=[]; secpeak=[]; trial=1; for trial =1:1 for filenum =1:1 for sensorselect=4 inputfile = ['ir+' num2str(min(trial,2) '.'num2str(filenum)]; inputfile = 'all+'; %inputfile = [' height5s_stoy' num2str(filenum)]; multilevel_extract; hrdata(:,filenum) = heart_beat_RED; dcdata(filenum) = median(x_avg); %if nnz(x(:,filenum))==0; break; end r (filenum) = Heart_Rate_RED; vs=min(numel(peaks), numel(valleys)); p2pdata(filenum) = median(heart_beat_RED(peaks(1 :vs))heart_beat_RED(valleys(1:vs))); =[]; for i=2:numel(valleys)-2 en(end+l) = sum(heart_beat_RED(valleys(i):valleys(i+1)).A2); end benergy(filenum)=median( en); riset=[]; fallt-[];. if peaks(1)>valleys(1) for i=1:vs-l riset(end+l) = peaks(i)-valleys(i); fallt (end+1) = valleys(i+1)-peaks(i); end else for i=1:vs-1 riset(end+1) = peaks(i+1)-valleys(i); fallt(end +1) = valleys(i)-peaks(i); end end risetime(filenum)=median(riset); falltime(filenum)=median(fallt); for repeat=1:3 if peaks(1)Cvalleys(1 ); peaks(l)=[]; end end for i=1:floor(numel(peaks)/2) list_pdiff(i) = heart _beat_RED(peaks(2*i- l))heart beat_RED(peaks(2 *i)); end pdiff(filenum)=median(list_pdiff); secpeak(filenum)=secondpeak; peakspace(filenum) =peakspacing; valspace(filenum)=valleyspacing; medpeak(filenum) = mpeaks-mvalleys; end % suffix = ‘pressure’; % presf=csvread([inputfile suffix]); % presdata(filenum)=mean((presf(:,2)-.6)/2.8); end stoyrt(trial,:)=risetime*.005; stoyft(trial, :)=falltime * .005; stoyhr(trial,:)=r; stoysecpeak(trial, :)=secpeak; stoypeakspace(trial,:)=peakspace*.005; stoyvalspace(trial, :)=valspace* .005; stoymp(trial, :)=medpeak; end % stoyfts=stoyft./(min(stoyft’)’* [1 1 1 1 1]); % stoyrts=stoyrt./(min(stoyrt’)’*[1 1 1 1 1]); % stoysecpeaks=stoysecpeak./(min(stoysecpeak’)’*[1 1 1 1 1]); % stoymps=stoymp’. / (min (stoymp’)’*[ 1 1 1 1 1]); % % % for i=1:3;corroef(stoyhr(i,:),stoyrt(i,:)) % end % for i=1:3;corroef(stoyhr(i,:),stoyft(i,:) )) % end % for i=1:3;corrcoef(stoyhr(i,:),stoysecpeak(i,:)) % end . % % for i=1:3;corroef(stoybps(istoyrt(i,:)) % end % for i=1:3;corroef(stoybps(i, :),stoyft(i,:)) % end % for i=1:3;corroef(stoybps(i,:),stoysecpeak(i,:)) % end % for i=1:3;corroef(stoybps(i,:),stoyhr(i,:)) % end % % for i=1:3;corroef(stoybpd(i,:),stoyrt(i,:)) % end % for i=1:3;corroef(stoybpd(i,:),stoyft(i,:) ) % end % for i=1:3;corroef(stoybpd(istoysecpeak(i,:)) % end % for i=1:3;corroef(stoybpd(i,:),stoyhr(i,:)) % end % % peaks=[]; % for j = 1:4000 % if x(j,filenum)>5e-5 && x(j,filenum)==max(x(max(1, j75) min(4000,j) +75),filenum)) % peaks(end+1)=j; % end % end % % % % for j = 1:4000 % if heart_beat_RED(j)>5e-5 && heart_beat_RED(j)==max( heart_beat_RED(max(1,j-75):min(4000,j+75))) % peaks(end+1)=j; % end % end % t= 1: 4% figure % plot(t,stoylbpd, 'o',t,stoy2bpd,'o', t,stoy3bpd,'o') % axis([.5 4.5-1 1]) % set(gca,'XTick', 1:4) % set(gca, 'XTickLabel', {'Rise Time' 'Fali Time' 'Second Peak Strength' 'Heart Rate' }) % legend({Trial 1' 'Trial 2' 'Trial 3'}) % title('Correlations: Met rics vs . 
Diastolic Blood Pressure, Henrik') % ylabel('Correlation Coefficient') % figure % plot(t,stoylbps, 'o',t,stoy2bps, 'o',t,stoy3bps, 'o') % axis([.5 4.5-11]) %set(gca,'XTick',1:4) % set(gca,'XTickLabel', {'Rise Time' 'Fali Time' 'Second Peak Strength' 'Heart Rate'}) % legend( {'Trial 1' 'Trial 2' 'Trial 3'}) % title('Correlations: Metrics vs. Systolic Blood Pressure, Henrik') % ylabel('Correlation Coefficient') % figure % plot(t,stoylhr,'o ',t,stoy2hr,'o',t,stoy3hr,'o') % axis ([.5 4.5-1 1 ]) %set(gca,'XTick',1:4) % set(gca,'XTickLabel ', {'Rise Time' 'Fali Time' 'Second Peak Strength' 'Heart Rate'}) % legend({'Trial 1' 'Trial 2' Trial 3'}) % title('Correlations: Metrics vs. Heart Rate , Henrik') % ylabel('Correlation Coefficient') function [ pointcoords ] = rgbfind(filename) im_unfiltered = imread(filename); %[y x rgb] %h = fspecial (‘gaussian’, 1.0, 10); %im=imfilter(im_unfiltered,h); im=im_unfiltered; r = im(:, :, 1); g = im(:, :, 2); b = im(:, :, 3); % image(im); %goal rgb = 0.160,170 goalr = 0; goalg = 160; goalb = 170; tol=50; %goal offset tolerance match=zeros(size(im,1),size(im,2),2); for y = l:size(im,l) for x = l:size(im,2) if (r(y,x)>goalr+tol) | (r(y,x) <goalr-tol) . . . | (g(y,x)>goalg+tol) | (g(y,x) <goalg-tol) . . . | (b(y,x)>goalb+tol) | (b(y,x)<goalb-tol) %not a match %match(y,x,:)=[0,0,0]; else %match match (y, x, :) = [1.0]; end end end numblobs=0; blob=[]; for y= l:size(im,1) for x = l:size(im,2) if match(y,x,1)==1. %these matches are already in blobs if match(y-1,x+2,l)==l match(y,x,2)=match(y-1,x+2,2); blob(match(y-1,x+2,2)).x(end+1)=x; blob(match(y-1,x+2,2)) .y(end+1)=y; elseif match(y-1,x+1,l)== 1 match(y,x,2)=match(y-1,x+1,2); blob(match(y-1,x+1,2)).x(end+1)=x; blob(match(y-1, x+1,2)).y(end+1)=y; elseif match(y-1,x,1)==l match(y,x,2)=match(y-1,x,2); blob(match(y-1,x,2)).x(end+1)=x; blob(match(y-1,x,2)).y(end+1)=y; elseif match(y-1,x-1,1)==1 match(y,x,2)=match(y-1,x-1,2); blob(match(y-1,x-1,2)).x(end+1)=x; blob(match(y-1,x-1,2)).y(end+1)=y; elseif match(y,x-1,1)=1 match(y,x,2)=match(y,x-1,2); blob(match(y,x-1,2)) .x(end+1)=x; blob(match(y,x-1,2)).y(end+1)=y; %other matches require new blob else%if match(y+1,x-1, 1)== 1 numblobs = numblobs+1; match(y,x,2)=numblobs; blob(numblobs).x=x; blob(numblobs). 
y=y; end end end end merged=zeros(1,numblobs); figure();image(match(:,:,2)+l); for y = size(im,1):-1:1 for x = size(im,2):-1:1 if match(y,x,1)==1 %these matches are already in blobs if (match (y,x+1,1)==1) && (match(y,x,2)~=match(y,x+1,2)) merged(match(y,x,2))=match( y,x+1.2); match(y,x,2)=match(y,x+1,2); blob(match(y,x+1,2)).x(end+1)=x; blob(match(y,x+1,2)).y(end+1)=y; elseif match(y+1,x+1,l)==l && match(y,x,2)~=match(y+1,x+1,2) merged(match(y,x,2)) =match(y+1,x+1.2); match(y,x,2)=match(y+1,x+1.2); blob(match(y+1,x+1,2)).x(end+1)=x; blob(match(y+1,x+1,2)).y(end+1)=y; elseif match(y+1,x,1)==1 && match(y,x,2)~=match(y+1,x,2) merged(match(y,x,2))=match(y +1.x.2); match(y,x,2)=match(y+1,x,2); blob(match(y+1,x,2)).x(end+1)=x; blob(match(y+1,x,2)) .y(end+1) =y; elseif match(y+1,x-1,1)==1 && match(y,x,2)~=match(y+1,x- 1,2) merged(match(y,x,2)) =match(y+1,x-1.2); match(y,x,2)=match(y+1,x-1.2); blob(match(y+1,x-1,2)) .x(end+1)=x; blob(match(y+1,x-1,2)).y(end+1)=y; end end end end for y = size(im,1):-1:1 for x = size(im,2):-1 :1 if match(y,x,l)==1 if merged(match(y) ,x,2))>0 while merged(match(y,x,2))>0 match(y, x,2)=merged(match(y,x,2)); blob(match(y,x,2)).x(end+1)=x; blob(match(y,x,2)) .y(end+1)=y; end;end;end;end;end blob(find(merged))=[]; pointcoords= [ ]; for i=1:size(blob,2) pointcoords(i,:)=[mean(blob(i).y);mean(blob(i.x)]; end pointcoords=round (pointcoords); figure();imshow(match(:,;,1)); figure(); image(match(:,:,2) + 1); %+(match(:,:,2)>0)*3 end function [ exppic ] = imoverlay(pcs, im, impic) p1 = pcs (1, :); p2 = pcs(2,:); p3 = pcs (3, :); d1=pl(1)-pl(2); d2=p2(1)-p2(2); d3=p3(1)-p3(2); s1=p1(1)+p1(2); s2=p2(1)+p2(2); s3=p3(1)+p3(2); [a,v] = max([d1 d2 d3]); [a,t] = min([s1 s2 s3]); [a,r] = max([s1 s2 s3]); %hyp = sqrt((pcs(v,1)-pcs(t,1))A2 + (pcs(v,2)-pcs(t,2)) A2); %adj = sqrt((pcs(v,1)-pcs(r,1)) A2 + (pcs(v,2)-pcs(r,2)) A2); %angle=atand(adj/hyp); ratio = (pcs(v,1)-pcs(t,1)) / (pcs(t,2)-pcs(v,2)); angle=atand(ratio); hangle = -l*(90 - angle); hoffset=(pcs(r,1)-pcs(t,1)- (pcs(t,2)-pcs(r,2))*tand(angle) ) * cosd(angle); scale=hoffset/size(im,2); imout = imresize(im,scale); padout = ones(size(imout)); padout - imrotate(padout, hangle); imout = imrotate(imout,hangle); sp=[00]; if hangle<0 for x=1:size(padout,2) for y=size(padout, 1):-1:1 if padout(y,x)==1 sp = [y x]; break end end if sp; break; end end else for y=size(padout,1);-1:1 for x=1:size(padout,2) if padout(y,x)==1 sp = [y x]; break end end if sp; break; end end end offy = pcs(v,1)-sp(1); offx = pcs(v,2)-sp(2); exp = zeros(size(impic)); exppic = exp; for y=1:size(padout,1) for x=1:size(padout,2) xcoord = max(1,offx+x); xcoord = min(xcoord,size(exp,2)); ycoord = max(1,offy+y); ycoord = min(ycoord,size(exp,1)); exp(ycoord,xcoord,:)=padout(y,x,:) exppic(ycoord,xcoord,:)=imout(y, x, :); end image(impic); hold on hobject = image(exppic/255); hold off set(hobject, ‘AlphaData’,exp(:, :, 1) /2); end function [ imdata ] = mapData(filename, ploten) %MAPDATA Summary of this function goes here % Detailed explanation goes here temp = csvread(filename); log_sp02 = temp(1,:); log_pressure - temp(2,:); log_x = temp(3,:); log_y = temp(4, :); clear temp; vais = []; log_x = abs(min(log_x))+log_x; log_y = abs(min(log_y))+log y; i=0; while i<numel(log_sp02) i=i+1; if log_sp02(i)<10 log_sp02(i)=[]; log_pressure(i)=[]; log_x(i)=[]; log_y(i)=[]; end end % for i=1:size(log_sp02,2) grid = zeros (floor((max(log_y))/5)+1 floor((max(log_x))/5)+1); [X, Y] = meshgrid(1:5:(max(log_x)), 1: 5 :(max(log_y))); while numel(log_sp02)>0 i=l; xmatch = 
find(log_x==log_x(i)); . ymatch = find(log_y==log_y(i)); match = intersect (xmatch, ymatch); vais(end+1,:) = [log_x(i) log_y(i) max(log_sp02(match))]; % grid(log_y(i)+1,log_x(i)+1) = max(log_sp02(match)); log_sp02(match)=[]; log_pressure(match)=[]; log_x(match)=[]; log_y(match)=[]; end %plot(sqrt(vais(:,1). A2 + vais (:,2) .A2),vais(:,3)); anisotropy = 1; %range x / range y alpha = 0; %angle between axis/anisotropy in degrees nu = 1; %; nu for covariance vgrid = [5 5]; [kout evar]= vebyk(vais,vgrid,5,anisotropy,alpha,nu, 1,0,0 for i=1:size(kout,1) if (size(grid,2)-1 < kout(i, l)/5) | (size(grid,1)-1 < kout(i, 2)/5) continue; end grid(kout(i,2)/5+1, kout(i, 1)/5 + 1)=kout (i,3); end %image(grid); imdata=[]; if ploten figure(); surf (X, Y, grid.); else imdat = ((grid-min(min() grid))) *255/(max(max(grid)) min(min(grid)))); rgbdata = ind2rgb(round(imdat),jet(256)); imwrite(rgbdata,'d_image.jpg', ' jpg') imdata=rgbdata; end end Figure Legends Figure 1. A) Processing Module B) Application Module C) Hardware D) Amplifier and Filter E) Pressure Sensor F) Filtration G) Processing Scripts H) Base Server Data I) Graphical User Interface J) Data Acquisition Unit K) Photodiode Array L) Patient Tissue M) Perfusion Data N) User O) Feature Extraction P) Hardware Configuration Q) Control R) Data S) LED array T) Intensity Controller U) Patient Figure 11 A) PC [Computer] B) NIDAQ C) S Digital TTL output D) Analog DC Input E) Application module F) Processing module Figure 15 A) Verification area Figure 17 1. Perfusion Data 8. Motion Noise Removal 9. RF Noise Filtering 10. RF Noise Extraction Data 11. Interpolation and Overlay 12. Location Mapping 13. Position Data Extraction 14. Feature Extraction 15. Data (From Database)
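For reference, the commented SpO2 section near the end of the listing above follows a conventional ratio-of-ratios estimate; in the notation of that code, and using the calibration constants that appear there, it corresponds to:

\[
R = \frac{\ln R_{\mathrm{red}}}{\ln R_{\mathrm{IR}}}\cdot\frac{\mathrm{RED\_sens}}{\mathrm{IR\_sens}},
\qquad
\mathrm{SpO_2} \approx \frac{0.81 - 0.18\,R}{0.63 + 0.11\,R}\times 100,
\]

where R_red and R_IR are the interpolated valley-to-peak ratios of the red and infrared pulse waveforms and RED_sens and IR_sens are the photodiode sensitivities at 660 nm and 880 nm defined at the top of the listing; the numerical coefficients are simply those used in the (commented-out) portion of the listing and are not asserted here as a general calibration.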
Claims (27) [0001] 1. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, characterized in that it comprises: a scanner (16), comprising: a planar sensor array (46); the sensor array (46) configured to be positioned in contact with a surface of the target tissue region (18); the sensor array (46) comprising one or more LED(s) (44) configured to emit light to the target tissue region (18) at a wavelength adjusted for hemoglobin; the sensor array (46) comprising one or more photodiodes (62) configured to detect light reflected from the LEDs (44); a data acquisition controller (40) coupled to the one or more LED(s) (44) and the one or more photodiode(s) (62) to control the emission and reception of light from the sensor array (46) to obtain perfusion oxygenation data associated with the target tissue region (18); an intensity controller (42) comprising a light source driver circuit (100) and electronically connected to the data acquisition controller (40), wherein the intensity controller (42) is configured to control the output intensity of the one or more LEDs (44) so that light penetrates throughout the target tissue region (18); and a processing module (12) coupled to the data acquisition controller (40), wherein the processing module (12) is configured to obtain readings from the sensor array (46) to obtain positional data of the scanner (16), the processing module (12) configured to generate a perfusion oxygenation map of the target tissue region (18) as a function of the acquired position data and perfusion oxygenation data, wherein the perfusion oxygenation map represents the spatial distribution of oxygen levels and the depth of penetration across the target tissue region (18). [0002] 2. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 1, wherein the scanner (16) further comprises: a pressure sensor (50) coupled to the sensor array (46); the pressure sensor (50) configured to obtain pressure readings of the sensor array contacting a surface of the target tissue region (18); characterized in that the scanner (16) is configured to obtain pressure sensor readings while obtaining perfusion oxygenation data to ensure correct contact of the scanner with the surface of the target tissue region (18). [0003] 3. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 2: characterized in that the pressure sensor (50) and the sensor array (46) are connected to a first side (66) of a printed circuit board (PCB) (60); and wherein the data acquisition controller (40) is connected to the PCB (60) on a second side (68) opposite said first side (66). [0004] 4. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 1, characterized in that each LED (64) comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light. [0005] 5. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 4: characterized in that the one or more LEDs (44) are coupled to the driver circuit (100); and wherein the driver circuit (100) is configured to allow the red LED emitter and the infrared LED emitter to be driven independently while sharing a common anode.
[0006] 6. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 5, characterized in that the driver circuit (100) comprises an amplifier (110); and a field effect transistor (112) configured to provide negative feedback. [0007] 7. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 2, characterized in that the processing module (12) is further configured to control the sampling of the pressure sensor (50) and the sensor array (46) for simultaneous acquisition of pressure sensor data and perfusion oxygenation data. [0008] 8. Apparatus for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 7, characterized in that the processing module (12) is configured to control the sampling of the pressure sensor (50) and the sensor array (46) for simultaneous acquisition of two or more data parameters selected from the group consisting of pressure sensor data, perfusion oxygenation data and positional data, and to simultaneously display said two or more data parameters. [0009] 9. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, characterized in that it comprises: (a) a scanner (16), comprising: a planar sensor array (46); the sensor array (46) configured to be positioned in contact with a surface of the target tissue region (18); the sensor array (46) comprising one or more light source(s) (44) configured to emit light to the target tissue region (18) at a wavelength adjusted for hemoglobin; the sensor array (46) comprising one or more sensor(s) (62) configured to detect light reflected from the one or more light sources (44); a pressure sensor (50) coupled to the sensor array (46); the pressure sensor (50) configured to obtain pressure readings of the sensor array contacting a surface of the target tissue region (18); and (b) a data acquisition controller (40) coupled to the one or more sensor(s) (62) and configured to control the emission and reception of light from the sensor array (46) and obtain perfusion oxygenation data associated with the target tissue region (18); (c) an intensity controller (42) comprising a light source driver circuit (100) and electronically connected to the data acquisition controller (40), wherein the intensity controller (42) is configured to control the output intensity of the one or more light sources (44) so that light penetrates throughout the target tissue region (18); and (d) a processing module (12) coupled to the data acquisition controller (40); the processing module (12) configured to control the sampling of the pressure sensor (50) and the sensor array (46) for simultaneous acquisition of perfusion oxygenation data and pressure sensor data to ensure correct contact of the scanner (16) with the surface of the target tissue region (18); the processing module (12) being further configured to obtain readings from the sensor array (46) to obtain positional data of the scanner (16); the processing module (12) being further configured to generate a perfusion oxygenation map of the target tissue region (18) as a function of the acquired position data and perfusion oxygenation data, wherein the perfusion oxygenation map represents a measurement of blood flow and the spatial distribution of oxygen levels and depth of penetration throughout the target tissue region (18). [0010] 10.
System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 9, characterized in that: the light sources (44) in the sensor array (46) comprise one or more LEDs (44) configured to emit light to the target tissue region (18) at a wavelength adjusted for hemoglobin; and wherein the one or more sensors (62) comprise one or more photodiodes (62) configured to detect light reflected from the LEDs (44). [0011] 11. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 10, characterized in that: each of the one or more LEDs (44) comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light; wherein the one or more LEDs (44) are coupled to the driver circuit (100); and wherein the driver circuit (100) is configured to allow the red LED emitter and the infrared LED emitter to be driven independently while sharing a common anode. [0012] 12. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 9, characterized in that it further comprises: a graphical user interface (36); wherein the graphical user interface (36) is configured to display perfusion oxygenation data and pressure sensor data. [0013] 13. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 12, characterized in that the processing module (12) is further configured to interpolate the positional data to generate the perfusion oxygenation map of the target tissue region (18). [0014] 14. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 11, characterized in that the processing module (12) is configured to control the sampling of the pressure sensor (50) and the sensor array (46) for simultaneous acquisition of two or more data parameters selected from the group consisting of pressure sensor data, perfusion oxygenation data and positional data, and to simultaneously display the two or more data parameters. [0015] 15. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 13, characterized in that the processing module (12) is configured to receive an image of the target tissue region (18) and superimpose the perfusion oxygenation map over the image. [0016] 16. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 12, characterized in that the graphical user interface (36) is configured to allow user input to manipulate the settings of the sensor array (46) and the pressure sensor (50). [0017] 17. System for Monitoring Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 9, characterized in that the processing module further comprises: a filtering module (22); the filtering module (22) configured to filter in-band noise by subtracting data recorded when the one or more light source(s) (44) is (are) in an "off" state from data recorded when the one or more light source(s) (44) is (are) in an "on" state. [0018] 18.
Method for Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, characterized in that it comprises: positioning a sensor array (46) in contact with a surface of the target tissue region (18); emitting light from light sources (44) in the sensor array (46) into the target tissue region (18) at a wavelength adjusted for hemoglobin; receiving light reflected from the light sources (44); obtaining, from a pressure sensor (50), pressure data associated with the contact of the sensor array with a surface of the target tissue region (18); obtaining perfusion oxygenation data associated with the target tissue region (18); sampling the perfusion oxygenation data and the pressure data to ensure correct contact of the sensor array (46) with the surface of the target tissue region (18); obtaining readings from the sensor array (46) to obtain position data of the sensor array (46); and generating a perfusion oxygenation map of the target tissue region (18) as a function of the acquired position data and perfusion oxygenation data, wherein the perfusion oxygenation map represents a measure of blood flow and the spatial distribution of oxygen levels and depth of penetration throughout the target tissue region (18). [0019] 19. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 18, characterized in that: the light sources (44) comprise one or more LED(s) (44) configured to emit light into the target tissue region (18) at a wavelength adjusted for hemoglobin; and wherein the sensor array (46) comprises one or more photodiode(s) (62) configured to detect light reflected from the LEDs (44). [0020] 20. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 19, characterized in that: each of the one or more LEDs (44) comprises dual emitters configured to emit red (660 nm) and infrared (880 nm) light; the method further comprising independently driving the red LED emitter and the infrared LED emitter while the red LED emitter and the infrared LED emitter share a common anode. [0021] 21. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 18, characterized in that it further comprises: simultaneously displaying the perfusion oxygenation data and the pressure sensor data. [0022] 22. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 21, characterized in that it further comprises: interpolating the positional data to generate a perfusion oxygenation map of the target tissue region (18). [0023] 23. Method for Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 22, characterized in that the interpolation of the positional data comprises the application of a Kriging algorithm to the acquired positional data. [0024] 24. Method for Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 22, characterized in that it further comprises: sampling the pressure sensor (50) and the sensor array (46) for simultaneous acquisition of pressure sensor data, perfusion oxygenation data, and positional data; and simultaneously displaying the pressure sensor data, perfusion oxygenation data, and positional data. [0025] 25.
Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 22, characterized in that it further comprises: receiving an image of the target tissue region (18); and superimposing the perfusion oxygenation map over the image. [0026] 26. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 18, characterized in that it further comprises: providing a graphical user interface (36) to allow user input; and manipulating the sampling settings of the sensor array (46) and the pressure sensor (50) in accordance with said user input. [0027] 27. Method for Performing Real-Time Monitoring of Oxygenation by Perfusion of a Target Tissue Region (18) of a Patient, according to Claim 18, characterized in that it further comprises: cycling one or more light sources (44) between a first period when the one or more light sources (44) are in an "on" state and a second period when the one or more light sources (44) are in an "off" state; and filtering in-band noise by subtracting data recorded when the one or more light sources (44) are in the "off" state from data recorded when the one or more light sources (44) are in the "on" state.
Sulfonates, polymers, resist compositions and patterning process
Washing machine
Washing machine
Device for fixture finishing and tension adjusting of membrane
Structure for Equipping Band in a Plane Cathode Ray Tube
Process for preparation of 7 alpha-carboxyl 9, 11-epoxy steroids and intermediates useful therein an
国家/地区
|